Adaptive training set reduction for nearest neighbor classification
Authors
Abstract
The research community related to the human-interaction framework is becoming increasingly interested in interactive pattern recognition, taking direct advantage of the feedback information provided by the user at each interaction step in order to improve raw performance. The application of this scheme requires learning techniques that are able to adaptively re-train the system and tune ...
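As an illustration of the training-set reduction idea mentioned in the abstract, the sketch below combines a Hart-style condensing pass with an incremental step that absorbs user-corrected samples. It is a minimal assumed example, not the procedure proposed in the paper; the function names, the update rule, and the use of plain Euclidean 1-NN are all placeholders.

```python
import numpy as np

def condense(X, y):
    """Hart-style condensing: keep only the prototypes needed so that the
    reduced set classifies the full training set correctly with 1-NN.
    (Illustrative sketch only; not the paper's reduction rule.)"""
    keep = [0]                                   # start with a single prototype
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            # 1-NN prediction of sample i using the current prototypes
            d = np.linalg.norm(X[keep] - X[i], axis=1)
            if y[keep[np.argmin(d)]] != y[i] and i not in keep:
                keep.append(i)                   # absorb the misclassified sample
                changed = True
    return X[keep], y[keep]

def interactive_update(P_X, P_y, x_new, y_feedback):
    """Adaptive re-training step: add a user-corrected sample to the prototype
    set only if the current prototypes would misclassify it."""
    d = np.linalg.norm(P_X - x_new, axis=1)
    if P_y[np.argmin(d)] != y_feedback:
        P_X = np.vstack([P_X, x_new])
        P_y = np.append(P_y, y_feedback)
    return P_X, P_y
```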
Similar Articles

Adaptive Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chisqu...
Discriminant Adaptive Nearest Neighbor Classification
Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to finesse this curse of dimensionality. We use a local linear discriminant analysis to estimate an effective metric for computing neighborhoods. We determine the local decision b...
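As a rough illustration of the locally adaptive metric this abstract describes, the sketch below estimates within- and between-class scatter among a query's nearest neighbors and combines them into a local distance, in the spirit of (but not identical to) the published method; the neighborhood size and regularization constant are assumed values.

```python
import numpy as np

def local_metric(X, y, x0, k=50, eps=1.0):
    """Estimate a local metric around query x0 from its k nearest neighbors:
    directions with large local between-class variation are stretched, so the
    effective neighborhood shrinks along them.  Sketch only; k and eps are
    assumptions, not values from the paper."""
    nn = np.argsort(np.linalg.norm(X - x0, axis=1))[:k]
    Xn, yn = X[nn], y[nn]

    p = Xn.shape[1]
    mean_all = Xn.mean(axis=0)
    W = np.zeros((p, p))                      # pooled within-class scatter
    B = np.zeros((p, p))                      # between-class scatter
    for c in np.unique(yn):
        Xc = Xn[yn == c]
        frac = len(Xc) / len(Xn)
        W += frac * np.cov(Xc.T, bias=True)
        diff = (Xc.mean(axis=0) - mean_all)[:, None]
        B += frac * diff @ diff.T

    # Sigma = W^{-1/2} (W^{-1/2} B W^{-1/2} + eps*I) W^{-1/2}
    evals, evecs = np.linalg.eigh(W + 1e-6 * np.eye(p))
    evals = np.clip(evals, 1e-8, None)
    W_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Bstar = W_inv_sqrt @ B @ W_inv_sqrt
    return W_inv_sqrt @ (Bstar + eps * np.eye(p)) @ W_inv_sqrt

def local_distance(Sigma, x, x0):
    """Squared distance under the locally estimated metric."""
    diff = x - x0
    return float(diff @ Sigma @ diff)
```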
Problem Set 1: K-Nearest Neighbor Classification
In this part, you will implement the k-Nearest Neighbor (k-NN) algorithm on the 8-scenes category dataset of Oliva and Torralba [1]. You are given a total of 800 labeled training images (100 images per class) and 1888 unlabeled testing images. Figure 1 shows some sample images from the dataset. Your task is to analyze the performance of the k-NN algorithm in classifying photographs into...
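For context, a plain k-NN classifier of the kind such a problem set asks for might look like the following; the feature representation, the value of k, and the majority-vote tie handling are assumptions, not taken from the assignment.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Plain k-NN with Euclidean distance and majority vote.
    Assumes images have already been summarized as fixed-length feature
    vectors; the feature choice and k are placeholders."""
    preds = np.empty(len(X_test), dtype=y_train.dtype)
    for i, x in enumerate(X_test):
        d = np.linalg.norm(X_train - x, axis=1)       # distances to all training points
        nn_labels = y_train[np.argsort(d)[:k]]        # labels of the k closest
        vals, counts = np.unique(nn_labels, return_counts=True)
        preds[i] = vals[np.argmax(counts)]            # majority vote
    return preds

# Usage with the stated split sizes (800 training, 1888 test) and dummy features:
# X_train, y_train = np.random.rand(800, 512), np.random.randint(0, 8, 800)
# X_test = np.random.rand(1888, 512)
# labels = knn_predict(X_train, y_train, X_test, k=5)
```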
Adaptive Kernel Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels t...
Journal
Journal title: Neurocomputing
Year: 2014
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2014.01.033